Echo State Property Linked to an Input: Exploring a Fundamental Characteristic of Recurrent Neural Networks
Authors
Abstract
The echo state property is a key for the design and training of recurrent neural networks within the paradigm of reservoir computing. In intuitive terms, this is a passivity condition: a network having this property, when driven by an input signal, will become entrained by the input and develop an internal response signal. This excited internal dynamics can be seen as a high-dimensional, nonlinear, unique transform of the input with a rich memory content. This view has implications for understanding neural dynamics beyond the field of reservoir computing. Available definitions and theorems concerning the echo state property, however, are of little practical use because they do not relate the network response to temporal or statistical properties of the driving input. Here we present a new definition of the echo state property that directly connects it to such properties. We derive a fundamental 0-1 law: if the input comes from an ergodic source, the network response has the echo state property with probability one or zero, independent of the given network. Furthermore, we give a sufficient condition for the echo state property that connects statistical characteristics of the input to algebraic properties of the network connection matrix. The mathematical methods that we employ are freshly imported from the young field of nonautonomous dynamical systems theory. Since these methods are not yet well known in neural computation research, we introduce them in some detail. As a side story, we hope to demonstrate the eminent usefulness of these methods.
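To make the "entrainment" idea concrete, the following minimal sketch (not taken from the paper; the reservoir size, the tanh update rule, and the rescaling of the spectral radius to 0.9 are common ESN conventions assumed here for illustration, not the paper's specific condition) drives two copies of the same random reservoir, started from different initial states, with one input sequence. If the echo state property holds for that input, the two state trajectories collapse onto the same input-driven response.

```python
# Minimal sketch of an echo state network reservoir driven by an input signal.
# The echo state property means the reservoir state becomes a function of the
# input history alone: two copies started from different initial states
# converge when driven by the same input. All sizes and constants below are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, T = 100, 500                              # reservoir size, number of time steps

W = rng.normal(size=(n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale spectral radius to 0.9
W_in = rng.normal(size=n)                    # input weights

u = rng.normal(size=T)                       # driving input signal

x_a = rng.normal(size=n)                     # two different initial reservoir states
x_b = rng.normal(size=n)
for t in range(T):
    x_a = np.tanh(W @ x_a + W_in * u[t])
    x_b = np.tanh(W @ x_b + W_in * u[t])

# If the echo state property holds for this input, the difference between the
# two trajectories decays toward zero.
print("final state difference:", np.linalg.norm(x_a - x_b))
```

With the spectral radius rescaled below 1, the printed difference is typically near machine precision; the paper's contribution is to tie such convergence to temporal and statistical properties of the driving input rather than to the network matrix alone.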
Similar resources
Multi-Step-Ahead Prediction of Stock Price Using a New Architecture of Neural Networks
Modelling and forecasting the stock market is a challenging task for economists and engineers, since it has a dynamic structure and nonlinear characteristics. This nonlinearity affects the efficiency of the price characteristics. Using an Artificial Neural Network (ANN) is a suitable way to model this nonlinearity, and ANNs have been used successfully in one-step-ahead and multi-step-ahead prediction of di...
A local Echo State Property through the largest Lyapunov exponent
Echo State Networks are efficient time-series predictors whose performance depends strongly on the value of the spectral radius of the reservoir connectivity matrix. Based on recent results from the mean-field theory of driven random recurrent neural networks, which enable the computation of the largest Lyapunov exponent of an ESN, we develop a cheap algorithm to establish a local and operational version of the Ec...
The Application of Multi-Layer Artificial Neural Networks in Speckle Reduction (Methodology)
Optical Coherence Tomography (OCT) uses the spatial and temporal coherence properties of optical waves backscattered from a tissue sample to form an image. An inherent characteristic of coherent imaging is the presence of speckle noise. In this study we use a new ensemble framework which is a combination of several Multi-Layer Perceptron (MLP) neural networks to denoise OCT images. The noise is...
Spiral Recurrent Neural Network for Online Learning
Autonomous, self-* sensor networks require sensor nodes with a certain degree of “intelligence”. An elementary component of such “intelligence” is the ability to learn online to predict sensor values. We consider recurrent neural network (RNN) models trained with an extended Kalman filter algorithm based on real-time recurrent learning (RTRL) with teacher forcing. We compared the performance ...
Using Echo State Networks for Cryptography
Echo state networks are simple recurrent neural networks that are easy to implement and train. Despite their simplicity, they show a form of memory and can predict or regenerate sequences of data. We make use of this property to realize a novel neural cryptography scheme. The key idea is to assume that Alice and Bob share a copy of an echo state network. If Alice trains her copy to memorize a m...
Journal: Neural Computation
Volume 25, Issue 3
Pages: -
Publication year: 2013